STATS 300 A : Theory of Statistics Fall 2015 Lecture 10 — October 22

Authors

  • Bryan He
  • Rahul Makhijani
Abstract

Let X1, . . . , Xn be iid N(θ, σ²), with σ² known. Our goal is to estimate θ under squared-error loss. For a first guess, pick the natural estimator X̄. Note that it has constant risk σ²/n, which suggests minimaxity, because we know that a Bayes estimator with constant risk is also minimax. However, X̄ is not Bayes for any prior: under squared-error loss, an unbiased estimator is Bayes only in the degenerate situation of zero risk (TPE Theorem 4.2.3), and X̄ is unbiased. Thus we cannot conclude from our previous results (e.g., TPE Corollary 5.1.5) that X̄ is minimax. We might try the wider class of estimators δ_{a,μ0}(X) = aX̄ + (1 − a)μ0 for a ∈ (0, 1) and μ0 ∈ ℝ, since many of the Bayes estimators we have encountered are convex combinations of a prior mean and the data mean. Note, however, that the worst-case risk of these estimators is infinite: by the bias–variance decomposition, R(θ, δ_{a,μ0}) = a²σ²/n + (1 − a)²(θ − μ0)², which tends to infinity as |θ − μ0| → ∞.
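A minimal numerical sketch of the last claim (not part of the notes; assumes NumPy, and the function names are illustrative): the exact risk of δ_{a,μ0} is a²σ²/n + (1 − a)²(θ − μ0)², small near μ0 but unbounded in θ, which a Monte Carlo check confirms.

```python
import numpy as np

def risk(theta, a, mu0, sigma=1.0, n=10):
    """Exact mean squared error of a*Xbar + (1-a)*mu0 at theta
    (variance term plus squared bias)."""
    return a**2 * sigma**2 / n + (1 - a)**2 * (theta - mu0)**2

def mc_risk(theta, a, mu0, sigma=1.0, n=10, reps=200_000, seed=0):
    """Monte Carlo estimate of the same risk, for verification."""
    rng = np.random.default_rng(seed)
    xbar = rng.normal(theta, sigma, size=(reps, n)).mean(axis=1)
    est = a * xbar + (1 - a) * mu0
    return np.mean((est - theta) ** 2)

# At theta = mu0 the shrinkage estimator beats Xbar's risk sigma^2/n = 0.1 ...
print(risk(0.0, a=0.9, mu0=0.0))    # approximately 0.081
# ... but the risk grows without bound as theta moves away from mu0:
print(risk(100.0, a=0.9, mu0=0.0))  # approximately 100.08
```

This is why sup_θ R(θ, δ_{a,μ0}) = ∞ for every fixed a ∈ (0, 1): the squared-bias term (1 − a)²(θ − μ0)² is unbounded in θ.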

Similar resources

STATS 300 A : Theory of Statistics Fall 2015 Lecture 11 — October 27

This lemma allows us to find a minimax estimator for a particular tractable submodel, and then show that the worst-case risk for the full model is equal to that of the submodel (that is, the worst-case risk doesn’t rise as you go to the full model). In this case, using the Lemma, we can argue that the estimator we found is also minimax for the full model. This was similar to how we justified mi...


STATS 300 A : Theory of Statistics Fall 2015 Lecture 3 —

Before discussing today’s topic matter, let’s take a step back and situate ourselves with respect to the big picture. As mentioned in Lecture 1, a primary focus of this course is optimal inference. As a first step toward reasoning about optimality, we began to examine which statistics of the data that we observe are actually relevant in a given inferential task. We learned about lossless data r...


Nachdiplom Lecture : Statistics meets Optimization Fall 2015 Lecture 1 – Tuesday , September 22

In the modern era of high-dimensional data, the interface of mathematical statistics and optimization has become an increasingly vibrant area of research. The goal of these lectures is to touch on various evolving areas at this interface. Before going into the details proper, let’s consider some high-level ways in which the objectives of optimization can be influenced by underlying statistical ob...



Publication date: 2015